
    Taxon ordering in phylogenetic trees: a workbench test

    Abstract

    Background: Phylogenetic trees are an important tool for representing evolutionary relationships among organisms. In a phylogram or chronogram, the ordering of taxa is not considered meaningful, since complete topological information is given by the branching order and the lengths of the branches, which are represented in the root-to-node direction. We apply a novel method based on a (λ + μ)-Evolutionary Algorithm to give meaning to the order of taxa in a phylogeny. This method applies random swaps between two taxa connected to the same node, without changing the topology of the tree. The evaluation of a new tree is based on different distance matrices, representing non-phylogenetic information such as other types of genetic distance, geographic distance, or combinations of these. To test our method we use published trees of Vesicular stomatitis virus, West Nile virus and Rice yellow mottle virus.

    Results: The best results were obtained when taxa were reordered using geographic information. Information supporting phylogeographic analysis was recovered in the optimized tree, as evidenced by clustering of geographically close samples. Improving the trees using a separate genetic distance matrix altered the ordering of taxa, but not the topology, moving the longest branches to the extremities, as would be expected since they are the most divergent lineages. Improved representations of genetic and geographic relationships between samples were also obtained when merged matrices (genetic and geographic information in one matrix) were used.

    Conclusions: Our innovative method makes phylogenetic trees easier to interpret, adding meaning to the taxon order and helping to prevent misinterpretations.
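    The swap-based search described above can be sketched in a few lines. This is a hypothetical (1+1)-style simplification, not the authors' (λ + μ)-EA: the tree shape, the distance matrix, and the fitness function (sum of distances between neighbouring leaves, lower is better) are illustrative assumptions.

```python
import random

# Toy hill-climbing version of the idea in the abstract: swap the two
# children of a random internal node (topology is unchanged) and keep
# the swap only if similar taxa end up closer together in leaf order.

tree = [["A", "B"], [["C", "D"], "E"]]  # nested lists = internal nodes

def leaves(node):
    """Left-to-right leaf order of the tree."""
    if isinstance(node, str):
        return [node]
    return [leaf for child in node for leaf in leaves(child)]

def fitness(node, dist):
    """Sum of distances between neighbouring leaves (lower is better)."""
    order = leaves(node)
    return sum(dist[a][b] for a, b in zip(order, order[1:]))

def swap_children(node, rng):
    """Reverse the children of one random internal node; return it."""
    internal = []
    def collect(n):
        if isinstance(n, list):
            internal.append(n)
            for c in n:
                collect(c)
    collect(node)
    target = rng.choice(internal)
    target.reverse()
    return target

def optimise(node, dist, steps=200, seed=0):
    rng = random.Random(seed)
    best = fitness(node, dist)
    for _ in range(steps):
        target = swap_children(node, rng)
        score = fitness(node, dist)
        if score <= best:
            best = score          # accept the swap
        else:
            target.reverse()      # undo the swap
    return best
```

    With a distance matrix derived from (say) geographic positions, repeated accepted swaps cluster nearby samples in the displayed leaf order, which is exactly the effect reported in the Results.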

    Research of Image Retrieval Technology Based on Annular Moments and Fuzzy Clustering

    Since the 1970s, driven jointly by the research fields of database systems and computer vision, image retrieval has gradually become a research hotspot. Building on a systematic study of content-based image retrieval, this thesis investigates image feature extraction and representation, fuzzy clustering of high-dimensional image features, and relevance feedback, and proposes corresponding algorithms for each. The main work includes research on image retrieval methods based on color and texture features; building on existing theory, an experimental image retrieval system based on annular moments and fuzzy clustering was designed for a concrete image database, and the effectiveness of the above retrieval methods was analyzed and compared using the Corel image library. The main contributions of this thesis are as follows: (1) To improve retrieval performance, an image retrieval method combining color and texture features is proposed. For the color feature, it proposes... Images are an important information carrier and an important part of multimedia information. With the rapid development of the Internet and of computer and multimedia technologies, the volume of image data grows geometrically; how to effectively manage and retrieve large-scale image collections has therefore become a critical problem. Image retrieval has been an active research area since the 1970s, with ... Degree: Master of Engineering. Department and major: School of Software, Computer Software and Theory. Student ID: 2432007115182
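    The "fuzzy clustering" the abstract relies on is, in standard form, fuzzy c-means: each feature vector gets a graded membership in every cluster rather than a hard label. A minimal pure-Python sketch on toy 2-D points (a real CBIR system would cluster high-dimensional annular-moment, color, and texture features; cluster count, fuzzifier m, and seeding below are illustrative assumptions):

```python
# Minimal fuzzy c-means: alternate between updating memberships from
# distances to the centers, and updating centers as membership-weighted
# means, until convergence (here: a fixed number of iterations).

def dist(p, q):
    """Euclidean distance between two equal-length vectors."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def fuzzy_c_means(points, c=2, m=2.0, iters=50):
    """Return (centers, u); u[i][k] = membership of point k in cluster i."""
    # seed centers with points spread across the dataset
    centers = [list(points[i * len(points) // c]) for i in range(c)]
    u = [[0.0] * len(points) for _ in range(c)]
    for _ in range(iters):
        # membership update: closer centers get higher (soft) membership
        for k, p in enumerate(points):
            d = [max(dist(p, ctr), 1e-12) for ctr in centers]
            for i in range(c):
                u[i][k] = 1.0 / sum((d[i] / d[j]) ** (2 / (m - 1))
                                    for j in range(c))
        # center update: membership-weighted mean of all points
        for i in range(c):
            w = [u[i][k] ** m for k in range(len(points))]
            total = sum(w)
            centers[i] = [sum(wk * p[dim] for wk, p in zip(w, points)) / total
                          for dim in range(len(points[0]))]
    return centers, u
```

    The soft memberships are what make the method attractive for retrieval: a query image can be matched against several clusters in proportion to its membership, instead of being forced into one bucket.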

    Parton distributions: determining probabilities in a space of functions

    We discuss the statistical properties of parton distributions within the framework of the NNPDF methodology. We present various tests of statistical consistency, in particular that the distribution of results does not depend on the underlying parametrization and that it behaves according to Bayes' theorem upon the addition of new data. We then study the dependence of results on consistent or inconsistent datasets and present tools to assess the consistency of new data. Finally we estimate the relative size of the PDF uncertainty due to data uncertainties, and that due to the need to infer a functional form from a finite set of data.

    Comment: 11 pages, 8 figures; presented by Stefano Forte at PHYSTAT 2011 (to be published in the proceedings).
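    The consistency requirement "behaves according to Bayes' theorem upon the addition of new data" can be illustrated with a toy conjugate-Gaussian example (this is not the NNPDF Monte Carlo replica machinery, just the property being tested): updating on dataset A and then on dataset B must give exactly the same posterior as updating on A and B jointly.

```python
# Toy Bayesian-consistency check: Gaussian prior on a parameter,
# Gaussian measurements with known noise variance. Sequential updating
# (A, then B) must reproduce the joint update (A + B at once).

def gaussian_update(prior_mean, prior_var, data, noise_var):
    """Posterior (mean, var) after observing `data` with known noise."""
    mean, var = prior_mean, prior_var
    for x in data:
        precision = 1 / var + 1 / noise_var
        mean = (mean / var + x / noise_var) / precision
        var = 1 / precision
    return mean, var

prior = (0.0, 10.0)                  # broad prior on the parameter
A, B = [1.2, 0.8, 1.1], [0.9, 1.3]   # two "datasets"

# sequential: the posterior after A becomes the prior for B
mid = gaussian_update(*prior, A, 0.5)
seq = gaussian_update(*mid, B, 0.5)

# joint: all data at once
joint = gaussian_update(*prior, A + B, 0.5)
```

    For conjugate models the two routes agree to machine precision; a fitting methodology that failed this check (e.g. because results depended on the parametrization) would not be statistically consistent in the sense the abstract tests.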

    Indirect composite restorations luted with two different procedures: a ten-year follow-up clinical trial

    Objectives: The aim of this clinical trial was to evaluate posterior indirect composite resin restorations, luted with two different procedures, ten years after placement. Study Design: In 23 patients, 22 inlays/onlays (Group A) were luted using a dual-cured resin composite cement and 26 inlays/onlays (Group B) were luted using a light-cured resin composite, for a total of 48 Class I and Class II indirect composite resin inlays and onlays. The restorations were evaluated at two time points: 1) one week after placement (baseline evaluation) and 2) ten years after placement, using the modified USPHS criteria. The Mann-Whitney and Wilcoxon tests were used to examine the differences between the baseline and ten-year evaluations for each criterion. Results: Numerical but not statistically significant differences were noted in the recorded clinical parameters (p > 0.05) between the inlays/onlays of Group A and Group B. 91% and 94% of Group A and Group B, respectively, were rated as clinically acceptable in all evaluated criteria after ten years of clinical function. Conclusions: Within the limits of the study, the results after ten years of function showed comparable clinical performance of indirect composite resin inlays/onlays placed with light-cured or dual-cured luting procedures.

    Looking for Criminal Intents in JavaScript Obfuscated Code

    The majority of websites incorporate JavaScript for client-side execution in a supposedly protected environment. Unfortunately, JavaScript has also proven to be a critical attack vector for both independent and state-sponsored groups of hackers. On the one hand, defenders need to analyze scripts to ensure that no threat is delivered and to respond to potential security incidents. On the other, attackers aim to obfuscate the source code in order to disorient the defenders or even to make code analysis practically impossible. Since code obfuscation may also be adopted by companies for legitimate intellectual-property protection, a dilemma remains on whether a script is harmless or malignant, if not criminal. To help analysts deal with such a dilemma, a methodology is proposed, called JACOB, which is based on five steps, namely: (1) source code parsing, (2) control flow graph recovery, (3) region identification, (4) code structuring, and (5) partial evaluation. These steps implement a sort of decompilation for control-flow-flattened code, which is progressively transformed into something close to the original JavaScript source, thereby making subsequent code analysis possible. Most relevantly, JACOB has been successfully applied to uncover unwanted user tracking and fingerprinting in e-commerce websites operated by a well-known Chinese company.
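    Control-flow flattening, the obfuscation JACOB targets, hides the original statement order behind a dispatcher loop over a state variable; step (5), partial evaluation, recovers the order by resolving the state transitions. A toy sketch of that last idea (Python standing in for JavaScript, with an invented dispatch table; JACOB's actual pipeline is far more involved):

```python
# Toy model of control-flow flattening: each "statement" is stored in a
# dispatch table keyed by a state value, with the next state alongside.
# "Partially evaluating" the state variable - following the transitions
# from the entry state - recovers the straight-line statement sequence.

FLATTENED = {              # state -> (statement, next_state)
    7: ("x = read()", 3),
    3: ("y = x * 2", 9),
    9: ("send(y)", None),  # None marks the exit state
}
ENTRY = 7

def unflatten(table, entry):
    """Follow state transitions to recover the linear statement order."""
    order, state, seen = [], entry, set()
    while state is not None:
        if state in seen:  # guard against looping dispatch tables
            raise ValueError("cycle in dispatch table")
        seen.add(state)
        stmt, state = table[state]
        order.append(stmt)
    return order
```

    Real flattened scripts compute the next state dynamically (often through opaque arithmetic), which is why JACOB needs the earlier CFG-recovery and region-identification steps before partial evaluation can resolve the transitions.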